Riemannian geometry on positive definite matrices
Abstract
The Riemannian metric on the manifold of positive definite matrices is defined by a kernel function $\varphi$ in the form

$$K_D(H,K) = \sum_{i,j} \varphi(\lambda_i, \lambda_j)^{-1} \operatorname{Tr} P_i H P_j K,$$

where $D = \sum_i \lambda_i P_i$ is the spectral decomposition of the foot point $D$ and the Hermitian matrices $H, K$ are tangent vectors. For such kernel metrics the tangent space has an orthogonal decomposition. The pull-back of a kernel metric under a mapping $D \mapsto G(D)$ is again a kernel metric. Several Riemannian geometries from the literature arise as particular cases, for example the Fisher-Rao metric for multivariate Gaussian distributions and the quantum Fisher information. The paper mostly studies the case $\varphi(x, y) = M(x, y)$, where $M(x, y)$ is a mean of the positive numbers $x$ and $y$, and presents results about geodesic curves and geodesic distances. The geometric mean, the logarithmic mean and the root mean are important special cases.

AMS classification: 15A45; 15A48; 53B21; 53C22
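As a concrete illustration of the kernel-metric formula above (not part of the paper), the following numpy sketch evaluates $K_D(H,K)$ from the eigendecomposition of the foot point. The helper names `kernel_metric` and `log_mean` are our own; the logarithmic mean is one of the means highlighted in the abstract.

```python
import numpy as np

def kernel_metric(D, H, K, mean):
    """Evaluate K_D(H, K) = sum_{i,j} mean(l_i, l_j)^{-1} Tr(P_i H P_j K),
    where D = sum_i l_i P_i is the spectral decomposition of the foot point
    and H, K are symmetric tangent vectors."""
    lam, U = np.linalg.eigh(D)                   # D = U diag(lam) U^T
    Ht, Kt = U.T @ H @ U, U.T @ K @ U            # tangent vectors in the eigenbasis
    W = 1.0 / mean(lam[:, None], lam[None, :])   # weights mean(l_i, l_j)^{-1}
    return float(np.sum(W * Ht * Kt.T))          # sum_{i,j} W_ij Ht_ij Kt_ji

def log_mean(x, y):
    """Logarithmic mean M(x, y) = (x - y) / (log x - log y), with M(x, x) = x."""
    with np.errstate(divide="ignore", invalid="ignore"):
        m = (x - y) / (np.log(x) - np.log(y))
    return np.where(np.isclose(x, y), x, m)

# Example: a random positive definite foot point and a symmetric tangent vector
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
D = A @ A.T + np.eye(4)
H = rng.normal(size=(4, 4)); H = H + H.T
print(kernel_metric(D, H, H, log_mean))   # squared length of H at D
```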
Similar papers
Riemannian Metric Learning for Symmetric Positive Definite Matrices
Over the past few years, symmetric positive definite (SPD) matrices have been receiving considerable attention from the computer vision community. Though various distance measures have been proposed in the past for comparing SPD matrices, the two most widely used measures are the affine-invariant distance and the log-Euclidean distance. This is because these two measures are true geodesic distances induced...
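Both distances have closed forms that are easy to compute. A minimal numpy sketch, assuming real SPD inputs (the function names are ours):

```python
import numpy as np

def _spd_fun(A, f):
    """Apply a scalar function to an SPD matrix through its eigendecomposition."""
    lam, U = np.linalg.eigh(A)
    return (U * f(lam)) @ U.T

def airm_distance(A, B):
    """Affine-invariant distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F."""
    Ais = _spd_fun(A, lambda x: x ** -0.5)
    return np.linalg.norm(_spd_fun(Ais @ B @ Ais, np.log), "fro")

def log_euclidean_distance(A, B):
    """Log-Euclidean distance d(A, B) = ||log A - log B||_F."""
    return np.linalg.norm(_spd_fun(A, np.log) - _spd_fun(B, np.log), "fro")
```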
Symmetric Positive-Definite Matrices: From Geometry to Applications and Visualization
In many engineering applications that use tensor analysis, such as tensor imaging, the underlying tensors have the characteristic of being positive definite. It might therefore be more appropriate to use techniques specially adapted to such tensors. We will describe the geometry and calculus on the Riemannian symmetric space of positive-definite tensors. First, we will explain why the geometry,...
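A central piece of that calculus is the geodesic of the affine-invariant metric, which has a closed form. A small sketch under the same real-SPD assumptions (helper names ours):

```python
import numpy as np

def _spd_pow(A, t):
    """A^t for an SPD matrix A via its eigendecomposition."""
    lam, U = np.linalg.eigh(A)
    return (U * lam ** t) @ U.T

def spd_geodesic(A, B, t):
    """Point at parameter t on the affine-invariant geodesic from A to B:
    gamma(t) = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}.
    t = 0.5 gives the matrix geometric mean of A and B."""
    As, Ais = _spd_pow(A, 0.5), _spd_pow(A, -0.5)
    return As @ _spd_pow(Ais @ B @ Ais, t) @ As
```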
A Geometry Preserving Kernel over Riemannian Manifolds
The kernel trick and projection to tangent spaces are two choices for linearizing data points lying on Riemannian manifolds. These approaches are used to provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...
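The paper's own kernel construction is not reproduced here; for comparison, one widely used geometry-aware kernel on SPD matrices is the log-Euclidean Gaussian kernel, sketched below (assumptions: real SPD inputs, our function names):

```python
import numpy as np

def _spd_log(A):
    """Matrix logarithm of an SPD matrix via its eigendecomposition."""
    lam, U = np.linalg.eigh(A)
    return (U * np.log(lam)) @ U.T

def log_euclidean_kernel(X, Y, sigma=1.0):
    """Gaussian kernel on the SPD manifold under the log-Euclidean metric:
    k(X, Y) = exp(-||log X - log Y||_F^2 / (2 sigma^2)).  Unlike a Euclidean
    RBF kernel on raw matrix entries, it respects the manifold geometry while
    remaining a positive-definite kernel."""
    d2 = np.linalg.norm(_spd_log(X) - _spd_log(Y), "fro") ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))
```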
Wasserstein Riemannian Geometry of Positive-definite Matrices
The Wasserstein distance on multivariate non-degenerate Gaussian densities is a Riemannian distance. After reviewing the properties of the distance and the metric geodesic, we derive an explicit form of the Riemannian metric on positive-definite matrices and compute its tensor form with respect to the trace scalar product. The tensor is a matrix which is the solution of a Lyapunov equation. ...
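The geodesic distance in question has a well-known closed form, the Bures-Wasserstein distance. A numpy sketch, with our own helper names:

```python
import numpy as np

def _spd_sqrt(A):
    """Principal square root of an SPD matrix via its eigendecomposition."""
    lam, U = np.linalg.eigh(A)
    return (U * np.sqrt(lam)) @ U.T

def wasserstein_distance(A, B):
    """L2-Wasserstein distance between centred Gaussians with covariances
    A and B:  d(A, B)^2 = Tr A + Tr B - 2 Tr (A^{1/2} B A^{1/2})^{1/2}."""
    As = _spd_sqrt(A)
    cross = _spd_sqrt(As @ B @ As)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2, 0.0))   # clip tiny negative values from round-off
```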
Geometric distance and mean for positive semi-definite matrices of fixed rank
This paper introduces a new distance and mean on the set of positive semi-definite matrices of fixed-rank. The proposed distance is derived from a well-chosen Riemannian quotient geometry that generalizes the reductive geometry of the positive cone and the associated natural metric. The resulting Riemannian space has strong geometrical properties: it is geodesically complete, and the metric is ...
On approximating the Riemannian 1-center
We generalize the Euclidean 1-center approximation algorithm of Bădoiu and Clarkson (2003) to arbitrary Riemannian geometries, and study the corresponding convergence rate. We then show how to instantiate this generic algorithm to two particular settings: (1) the hyperbolic geometry, and (2) the Riemannian manifold of symmetric positive definite matrices.
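A sketch of that generic geodesic-walk scheme instantiated on the SPD manifold with the affine-invariant metric. The 1/(t+1) step size mirrors the Euclidean Bădoiu-Clarkson recursion; the function names and the fixed iteration count are our assumptions, not the paper's interface.

```python
import numpy as np

def _spd_pow(A, t):
    """A^t for an SPD matrix A via its eigendecomposition."""
    lam, U = np.linalg.eigh(A)
    return (U * lam ** t) @ U.T

def geodesic(A, B, t):
    """Affine-invariant geodesic: A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}."""
    As, Ais = _spd_pow(A, 0.5), _spd_pow(A, -0.5)
    return As @ _spd_pow(Ais @ B @ Ais, t) @ As

def distance(A, B):
    """d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F."""
    Ais = _spd_pow(A, -0.5)
    return np.linalg.norm(np.log(np.linalg.eigvalsh(Ais @ B @ Ais)))

def riemannian_1center(points, iters=100):
    """Approximate the 1-center (minimax center): at step t, move from the
    current center toward the farthest point by a 1/(t+1) geodesic step."""
    c = points[0]
    for t in range(1, iters + 1):
        far = max(points, key=lambda P: distance(c, P))
        c = geodesic(c, far, 1.0 / (t + 1))
    return c
```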